If BL Systems was the idea, the Central Data Centre was the proof, as Keith Adams relates in Part Two of this three-part story.
This is the story of how British Leyland quietly built a networked computing operation that reached from the boardroom to the factory floor, decades ahead of its time.
The BL Systems story – the Data Centre years

If BL Systems was the idea, the Central Data Centre (above) was the proof.
By the late 1970s, British Leyland had accepted that computing could no longer be treated as a collection of local utilities scattered across its factories. The machines were too expensive, the data too valuable and the duplication too great. What was needed was a single centre capable of running the company’s core systems and feeding information back to the factory floor in near real time.
For Chris Chiles (below), later Chief Executive of ISTEL, who was brought in to rationalise BL’s sprawling technical estate, the logic was obvious. ‘Those machines were three-million-pound assets,’ he recalled. ‘And that was just the hardware. You had air conditioning, power, specialist staff. The only way to make sense of it was to centralise.’
Building the Central Data Centre
The result was the Central Data Centre, built on a greenfield site at Redditch after planning restrictions ruled out Solihull. It was conceived not as a passive data warehouse, but as an operational hub. IBM mainframes sat alongside large numbers of DEC VAX machines, supported by industrial-scale power supplies and cooling systems.
‘People don’t realise this now,’ John Leighfield later observed, ‘but we became the biggest commercial user of DEC VAX systems anywhere in the UK. Even DEC didn’t realise that until we told them.’
The scale of the task went far beyond construction. Processing had to be transferred from Cowley, Longbridge, Solihull, Jaguar and other sites without interrupting payroll, production planning or logistics. ‘You couldn’t just turn things off and move them,’ Chiles said. ‘Everything had to keep running while you did it. That was the real challenge.’

Connecting the factories
If computing power was one obstacle, communications were another entirely. Britain’s telecoms infrastructure simply could not support what BL Systems needed. ‘BT could offer us 64 kilobits if we were lucky,’ Chiles recalled. ‘And delivery times were measured in years. That just wasn’t going to work.’
The solution was audacious. By a quirk of history, the old BMC had been granted a virtually unique dispensation from the Post Office monopoly, but had never exploited it. BL Systems persuaded the BL Board to let it turn that dormant permission into its own microwave communications network, linking factories across the Midlands and Oxfordshire, and onward over the Chilterns to London.
‘We knew what needed to be done,’ Chiles said. ‘The technology wasn’t quite there, so we adapted what we could get.’ What emerged was a private industrial network, decades ahead of its time.
Computing meets the production line
At Longbridge, this approach culminated in a system known internally as Big C, developed to manage Austin Metro production (below). ‘It balanced the line,’ Leighfield explained. ‘It decided what to build next based on what was actually happening, not what somebody hoped was happening.’ Bodies no longer sat idle long enough to corrode. Cars were built based on individual customer priorities. Throughput improved, and quality followed.
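How Big C made those decisions is not detailed here, but the behaviour Leighfield describes – picking the next build from the line’s actual state, weighing ageing bodies against customer priority – is recognisably a sequencing rule. A minimal sketch of that idea in modern Python, with every name, field and threshold hypothetical rather than anything taken from Big C itself:

```python
from dataclasses import dataclass

@dataclass
class Order:
    """A customer order waiting to be sequenced (all fields hypothetical)."""
    priority: int          # lower value = more urgent customer
    body_age_hours: float  # how long the bare body has been waiting
    spec: str = "base"

def next_build(orders: list[Order], max_body_age: float = 48.0) -> Order:
    """Pick the next car to build from the line's actual state.

    Bodies that have waited too long (corrosion risk) jump the queue;
    otherwise the most urgent customer order wins.
    """
    overdue = [o for o in orders if o.body_age_hours > max_body_age]
    pool = overdue or orders
    chosen = min(pool, key=lambda o: (o.priority, -o.body_age_hours))
    orders.remove(chosen)
    return chosen

# An overdue body beats a fresher, higher-priority order.
queue = [Order(priority=1, body_age_hours=2.0),
         Order(priority=5, body_age_hours=60.0, spec="1.3 HLS")]
print(next_build(queue).spec)  # -> 1.3 HLS
```

The point of such a rule is exactly the one Leighfield makes: bodies stop waiting long enough to corrode, because their age is part of the decision rather than an afterthought.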
Elsewhere, BL Systems Engineers took computing deeper still into manufacturing. At Swindon’s Press Shop, strain gauges were fitted to presses to monitor force and stroke behaviour automatically. ‘That wasn’t just finance or payroll systems,’ Leighfield noted. ‘That was getting right into the heart of production.’
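Nothing here records how those strain-gauge readings were interpreted, but the simplest useful pattern is threshold monitoring of each stroke’s force trace. A hypothetical sketch along those lines, with illustrative limits and units rather than anything BL actually used:

```python
def check_press_stroke(force_readings: list[float],
                       min_peak: float = 900.0,
                       max_peak: float = 1100.0) -> str:
    """Classify one press stroke from its strain-gauge force trace.

    The thresholds and units (tonnes) are illustrative only, not BL's
    actual operating limits.
    """
    peak = max(force_readings)
    if peak < min_peak:
        return "UNDERLOAD: possible misfeed or missing blank"
    if peak > max_peak:
        return "OVERLOAD: die wear or double blank suspected"
    return "OK"

# One simulated stroke, sampled as the ram descends and returns.
trace = [0.0, 120.0, 480.0, 1010.0, 640.0, 90.0, 0.0]
print(check_press_stroke(trace))  # -> OK
```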

Seeing the future before building it
One of the most far-reaching developments came when operational research met emerging desktop computing. Manufacturing flows that had previously been demonstrated using physical models and toy cars could now be simulated digitally. ‘A lot of this theory came out of wartime operational research,’ Leighfield said. ‘What changed was that we finally had the computing power to apply it properly.’
The result was software originally known as SeeWhy, later renamed Witness. It allowed Engineers to model assembly lines, logistics systems and entire factories before a single machine was installed. ‘You could change a layout on screen and see immediately what happened,’ Leighfield explained. ‘It showed you why problems occurred, not just that they did.’
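SeeWhy and Witness were real products, but their internals are not described here; the underlying technique, though, is discrete-event simulation. A toy sketch of that idea (the line, stations and timings are entirely invented) shows how rerunning the model with one slower station makes a bottleneck visible immediately – the ‘see why’ in the name:

```python
import random

def simulate_line(n_bodies: int = 50, paint_time: float = 4.0,
                  trim_time: float = 6.0, seed: int = 1) -> float:
    """Toy discrete-event model of a two-station line (paint -> trim).

    Returns the total makespan in minutes. The stations, timings and
    arrival rate are invented for illustration, not drawn from Witness.
    """
    random.seed(seed)
    paint_free = trim_free = finish = 0.0
    for i in range(n_bodies):
        arrive = i * 5.0                          # a body arrives every 5 min
        start_paint = max(arrive, paint_free)     # wait if paint is busy
        paint_free = start_paint + random.uniform(0.8, 1.2) * paint_time
        start_trim = max(paint_free, trim_free)   # wait if trim is busy
        trim_free = start_trim + random.uniform(0.8, 1.2) * trim_time
        finish = trim_free
    return finish

print(f"Baseline makespan:  {simulate_line():.0f} min")
print(f"Slower trim stage:  {simulate_line(trim_time=9.0):.0f} min")
```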
The software quickly escaped the motor industry. ‘It ended up being used all over the world,’ Chiles recalled, ‘including for things like airport terminals.’ However, its origins lay firmly in British Leyland’s need to understand how cars moved through its factories.

A place people wanted to be
Inside the CDC (above), the atmosphere reflected that sense of purpose. For Dave Handley, a Project Manager who moved from Longbridge operations into the centre, it felt transformative. ‘Everybody wanted to get there,’ he said. ‘People gave up shift allowances and took pay cuts. That’s where the future was.’
Tours were frequent, and the reaction was always the same. ‘It was the scale of it,’ Handley recalled. ‘There was nothing else like it. People were genuinely stunned.’
Visitors noticed too. Honda Engineers arriving during the joint-venture years were surprised by what they found. BMW, later, would privately acknowledge that Rover’s CAD-CAM and manufacturing IT capabilities were ahead of their own. ‘They just didn’t expect that level of integration,’ Leighfield (below) recalled. ‘Not from British Leyland.’
They did not call it the cloud – but the pattern is unmistakable: centralised computing power, distributed processing at the edge, high-speed private networks and real-time data driving decisions across a complex organisation.
British Leyland’s factories had started thinking.
Next: People, power and pride – and why brilliance on this scale still wasn’t enough.

[Editor’s Note: Otter.ai was used to record and transcribe the interview with John Leighfield, Chris Chiles and Dave Handley, and ChatGPT was used as an editorial aid during drafting to help structure, refine and grammar-check this article, as well as to draft several meta description suggestions. Photoshop was used to recrop and extend some backgrounds in the images, and the article was then human sub-edited. The interview and story were 100% human written.]
- History : Why British Leyland built BL Systems – Part Three : People, power and pride - 12 February 2026
- History : Why British Leyland built BL Systems – Part Two : The factory that thought - 11 February 2026
- History : Why British Leyland built BL Systems – Part One : Before the cloud - 10 February 2026

https://www.johntruslove.com/news/major-jobs-boost-for-redditch-on-the-cards-as-john-truslove-brings-landmark-site-to-market/
This is really fascinating stuff – sadly, though, it looks like the site this data centre sat on is now an Amazon distribution centre.
Hello Paul,
The site where the CDC was is next to the current large Amazon warehouse. The buildings have been demolished and the microwave tower taken down. There does not appear to be any building work going on.
If you are interested, here is a video showing the demise of the Data Centre and the demolition of the tower.
https://rvm.redditchheritage.online/05-IC/IC-100-Istel/00-index/istel-video.html
Forty-five years later, I’m stunned too. Who knew? This is proper ‘first draft of history’ journalism, and so unlike the narrative we’re used to hearing. But all strands of the BL story still converge in the familiar ending, and knowing that such flair and breadth of competence were squandered makes the outcome even sadder.